Statistical distance

In statistics, probability theory, and information theory, a statistical distance quantifies how far apart two statistical objects are. The objects may be, for example, two samples, two random variables, or two probability distributions.

Metrics

A metric on a set X is a function (called the distance function or simply distance)

d : X × X → R

(where R is the set of real numbers). For all x, y, z in X, this function is required to satisfy the following conditions:

  1. d(x, y) ≥ 0     (non-negativity)
  2. d(x, y) = 0   if and only if   x = y     (identity of indiscernibles; note that conditions 1 and 2 together produce positive definiteness)
  3. d(x, y) = d(y, x)     (symmetry)
  4. d(x, z) ≤ d(x, y) + d(y, z)     (subadditivity / triangle inequality).
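As an illustrative sketch (not part of the original article), the total variation distance between two discrete probability distributions, d(P, Q) = ½ Σᵢ |pᵢ − qᵢ|, is one statistical distance that satisfies all four axioms. A minimal Python check on sample distributions:

```python
def total_variation(p, q):
    """Total variation distance between two discrete probability
    distributions given as equal-length sequences of probabilities."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.5, 0.0]
q = [0.1, 0.4, 0.5]
r = [0.2, 0.3, 0.5]

# Spot-check axioms 1-4 on these sample points:
assert total_variation(p, q) >= 0                       # 1. non-negativity
assert total_variation(p, p) == 0                       # 2. identity of indiscernibles
assert total_variation(p, q) == total_variation(q, p)   # 3. symmetry
assert total_variation(p, r) <= total_variation(p, q) + total_variation(q, r)  # 4. triangle inequality
```

Such assertions only verify the axioms at particular points, of course; the full proofs hold for all distributions.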

Generalized metrics

Many statistical distances are not metrics, because they lack one or more properties of proper metrics. For example, pseudometrics can violate the positive definiteness (equivalently, "identity of indiscernibles") property; quasimetrics can violate the symmetry property; and semimetrics can violate the triangle inequality. Statistical distances that satisfy non-negativity and identity of indiscernibles, but not necessarily symmetry or the triangle inequality, are often referred to as divergences.
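A standard example of a divergence that is not a metric is the Kullback–Leibler divergence: it is non-negative and vanishes only when the two distributions coincide, but it is not symmetric. A short sketch (illustrative, not from the article):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions.
    Assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.9, 0.1]
q = [0.5, 0.5]

# D(P || Q) and D(Q || P) generally differ, so KL divergence
# violates the symmetry axiom and is not a metric.
assert kl_divergence(p, q) != kl_divergence(q, p)
assert kl_divergence(p, p) == 0.0  # but it does vanish on identical inputs
```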

Examples

Some important statistical distances include the following:

  - Total variation distance
  - Hellinger distance
  - Lévy–Prokhorov metric
  - Wasserstein metric
  - Mahalanobis distance
  - Kullback–Leibler divergence
  - f-divergences
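One concrete example (an illustrative sketch): the Hellinger distance, H(P, Q) = (1/√2) · √(Σᵢ (√pᵢ − √qᵢ)²), is a genuine metric on probability distributions, bounded between 0 and 1:

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions;
    a bona fide metric taking values in [0, 1]."""
    return math.sqrt(sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                         for pi, qi in zip(p, q))) / math.sqrt(2)

# Identical distributions are at distance 0; distributions with
# disjoint support are at the maximum distance 1.
assert hellinger([0.5, 0.5], [0.5, 0.5]) == 0.0
assert abs(hellinger([1.0, 0.0], [0.0, 1.0]) - 1.0) < 1e-12
```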
